
    Modelling structural coverage and the number of failure occurrences with non-homogeneous Markov chains

    Most software reliability growth models specify the expected number of failures experienced as a function of testing effort or calendar time. However, there are approaches to model the development of intermediate factors driving failure occurrences. This paper starts out by presenting a model framework consisting of four consecutive relationships. It is shown that a differential equation representing this framework is a generalization of several finite failures category models. The relationships between the number of test cases executed and expected structural coverage, and between expected structural coverage and the expected number of failure occurrences, are then explored further. A non-homogeneous Markov model allowing for partial redundancy in sampling code constructs is developed. The model bridges the gap between setups related to operational testing and systematic testing, respectively. Two extensions of the model considering the development of the number of failure occurrences are discussed. The paper concludes by showing that the extended models fit into the structure of the differential equation presented at the beginning, which permits further interpretation.
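
    As a purely illustrative sketch (not the non-homogeneous Markov model developed in the paper), the following Python snippet shows the idea of partial redundancy in sampling code constructs: a single assumed parameter rho interpolates between sampling constructs with replacement (rho = 0, akin to operational testing) and sampling only among not-yet-covered constructs (rho = 1, akin to systematic testing); the function name expected_coverage is likewise an assumption made for this example.

        # Monte Carlo sketch: expected structural coverage as a function of the
        # number of test cases executed, for different degrees of redundancy.
        import random

        def expected_coverage(n_test_cases, n_constructs, rho, runs=2000):
            """Estimate the expected coverage after each test case by simulation."""
            totals = [0.0] * n_test_cases
            for _ in range(runs):
                covered = set()
                for i in range(n_test_cases):
                    uncovered = [c for c in range(n_constructs) if c not in covered]
                    if uncovered and random.random() < rho:
                        construct = random.choice(uncovered)        # no redundancy
                    else:
                        construct = random.randrange(n_constructs)  # possibly redundant
                    covered.add(construct)
                    totals[i] += len(covered) / n_constructs
            return [t / runs for t in totals]

        if __name__ == "__main__":
            for rho in (0.0, 0.5, 1.0):
                curve = expected_coverage(n_test_cases=50, n_constructs=100, rho=rho)
                print(f"rho={rho}: expected coverage after 50 test cases = {curve[-1]:.2f}")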

    On J.M. Keynes' The principal averages and the laws of error which lead to them: refinement and generalisation

    Keynes (1911) derived general forms of probability density functions for which the “most probable value” is given by the arithmetic mean, the geometric mean, the harmonic mean, or the median. His approach was based on indirect (i.e., posterior) distributions and used a constant prior distribution for the parameter of interest. It was therefore equivalent to maximum likelihood (ML) estimation, the technique later introduced by Fisher (1912). Keynes' results suffer from the fact that he did not discuss the supports of the distributions, the sets of possible parameter values, and the normalising constants required to make sure that the derived functions are indeed densities. Taking these aspects into account, we show that several of the distributions proposed by Keynes reduce to well-known ones, like the exponential, the Pareto, and a special case of the generalised inverse Gaussian distribution. Keynes' approach based on the arithmetic, the geometric, and the harmonic mean can be generalised to the class of quasi-arithmetic means. This generalisation allows us to derive further results. For example, assuming that the ML estimator of the parameter of interest is the exponential mean of the observations leads to the most general form of an exponential family with location parameter introduced by Dynkin (1961) and Ferguson (1962, 1963).
    Keywords: ML estimator, criterion function, median, quasi-arithmetic mean, exponential family
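
    As a textbook illustration of the correspondence the paper studies (these two cases are standard results, not taken from the paper): maximising the likelihood under the Gaussian error law yields the arithmetic mean as the most probable value of the location parameter, while the Laplace law yields the median.

        \[
          f(x \mid \theta) = \frac{1}{\sqrt{2\pi}\,\sigma}
            \exp\!\left(-\frac{(x-\theta)^2}{2\sigma^2}\right)
          \quad\Longrightarrow\quad
          \hat{\theta}_{\mathrm{ML}} = \frac{1}{n}\sum_{i=1}^{n} x_i ,
        \]
        \[
          f(x \mid \theta) = \frac{1}{2b}
            \exp\!\left(-\frac{|x-\theta|}{b}\right)
          \quad\Longrightarrow\quad
          \hat{\theta}_{\mathrm{ML}} = \operatorname{median}(x_1,\dots,x_n) .
        \]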

    On a method for mending time to failure distributions

    Many software reliability growth models assume that the time to next failure may be infinite; i.e., there is a chance that no failure will occur at all. For most software products this is too good to be true even after the testing phase. Moreover, if a non-zero probability is assigned to an infinite time to failure, metrics like the mean time to failure do not exist. In this paper, we try to answer several questions: Under what condition does a model permit an infinite time to next failure? Why do all finite failures non-homogeneous Poisson process (NHPP) models share this property? And is there any transformation mending the time to failure distributions? Indeed, such a transformation exists; it leads to a new family of NHPP models. We also show how the distribution function of the time to first failure can be used for unifying finite failures and infinite failures NHPP models.
    Keywords: software reliability growth model, non-homogeneous Poisson process, defective distribution, (mean) time to failure, model unification
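
    As a worked example of the property discussed above (using the well-known Goel-Okumoto mean value function, not a model specific to this paper): for a finite failures NHPP model whose mean value function tends to a finite limit a, the time to first failure has a defective distribution, and its mean does not exist.

        \[
          \mu(t) = a\left(1 - e^{-bt}\right), \qquad \lim_{t\to\infty}\mu(t) = a < \infty ,
        \]
        \[
          F_{T_1}(t) = P(T_1 \le t) = 1 - e^{-\mu(t)}, \qquad
          \lim_{t\to\infty} F_{T_1}(t) = 1 - e^{-a} < 1 ,
        \]
        \[
          E[T_1] = \int_0^\infty \bigl(1 - F_{T_1}(t)\bigr)\,dt
                 \;\ge\; \int_0^\infty e^{-a}\,dt = \infty .
        \]

    The non-vanishing limit e^{-a} is exactly the probability that no failure occurs at all, which is why metrics like the mean time to failure do not exist for such models.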

    An Empirical Analysis of System-generated Data in Location-based Crowdsourcing

    This paper develops a research model explaining how task location and incentives affect the take-up and, for those tasks that are processed, the time to start. For the empirical analysis, we use the system-generated data of all 1860 location-based crowdsourcing tasks in Berlin available on the Streetspotr platform within one year. The results indicate that while the population density of the task location does not influence the probability that some crowdworker will eventually process the task, a task located in a more densely populated area tends to be taken up more quickly. Moreover, the take-up probability is expected to increase as the monetary and non-monetary incentives are raised. However, both increasing the monetary incentives and lowering the non-monetary incentives tend to shorten the time to start. This suggests that the high non-monetary incentives with which unattractive tasks are endowed do not entice crowdworkers to quickly set about processing these tasks.

    Analysis of Software Aging in a Web Server

    A number of recent studies have reported the phenomenon of “software aging”, characterized by progressive performance degradation and/or an increased occurrence rate of hang/crash failures of a software system due to the exhaustion of operating system resources or the accumulation of errors. To counteract this phenomenon, a proactive technique called “software rejuvenation” has been proposed. It essentially involves stopping the running software, cleaning its internal state and/or its environment, and then restarting it. Software rejuvenation, being preventive in nature, raises the question of when to schedule it. Periodic rejuvenation, while straightforward to implement, may not yield the best results, because the rate at which software ages is not constant but depends on the time-varying system workload. Software rejuvenation should therefore be planned and initiated in response to the actual system behavior. This requires the measurement, analysis and prediction of system resource usage. In this paper, we study the development of resource usage in a web server while subjecting it to an artificial workload. We first collect data on several system resource usage and activity parameters. Non-parametric statistical methods are then applied for detecting and estimating trends in the data sets. Finally, we fit time series models to the data collected. Unlike the models used previously in the research on software aging, these time series models allow for seasonal patterns, and we show how the exploitation of the seasonal variation can help in adequately predicting the future resource usage. Based on the models employed here, proactive management techniques like software rejuvenation triggered by actual measurements can be built.
    Keywords: software aging, software rejuvenation, Linux, Apache, web server, performance monitoring, prediction of resource utilization, non-parametric trend analysis, time series analysis
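
    A minimal sketch of the kind of analysis described above, assuming a synthetic hourly resource-usage series, Kendall's tau as a simple stand-in for the non-parametric trend test, and a seasonal Holt-Winters model from statsmodels in place of the authors' actual time series models:

        # Illustrative sketch: trend detection and seasonal forecasting of a
        # synthetic resource-usage series (daily seasonality plus a slow leak).
        import numpy as np
        from scipy.stats import kendalltau
        from statsmodels.tsa.holtwinters import ExponentialSmoothing

        rng = np.random.default_rng(0)
        hours = np.arange(24 * 14)
        usage = 1000 - 0.5 * hours + 50 * np.sin(2 * np.pi * hours / 24) \
                + rng.normal(0, 10, hours.size)

        # Non-parametric trend check: Kendall's tau between time and the series.
        tau, p_value = kendalltau(hours, usage)
        print(f"Kendall tau = {tau:.2f}, p = {p_value:.3g}")

        # Seasonal model with additive trend and 24-hour seasonality,
        # followed by a one-day-ahead forecast.
        model = ExponentialSmoothing(usage, trend="add", seasonal="add",
                                     seasonal_periods=24).fit()
        print("Predicted usage over the next 24 hours:", np.round(model.forecast(24), 1))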

    Über Belegungs-, Couponsammler- und Komiteeprobleme (On Occupancy, Coupon Collector, and Committee Problems)

    For buyers of collectible picture cards, the question frequently arises of how many purchases they have to make in order to obtain a certain number of cards, which are packaged in packs of several cards. To answer this and similar questions, we study generalisations of the occupancy problem and the coupon collector's problem. While in the basic models only a single element (one collectible card) is drawn at a time, the committee problem considers the simultaneous selection of several distinct elements. We generalise the coupon collector's problem in this sense as well. Using approaches from sampling theory and combinatorics, we finally extend the models so that individual occurrence probabilities can be allowed for the single cards.
    Keywords: urn models, occupancy problem, coupon collector's problem, committee problem, sampling without replacement, selection with unequal probabilities
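
    A minimal Monte Carlo sketch of the generalised coupon collector problem described above, assuming packs of several distinct cards and uniform card probabilities (the extension to unequal occurrence probabilities treated in the paper is not covered here):

        # How many packs of pack_size distinct cards must be bought, on average,
        # to complete a set of n_cards different cards?
        import random
        from statistics import mean

        def purchases_to_complete(n_cards, pack_size):
            """Simulate one collector buying packs until the set is complete."""
            owned, purchases = set(), 0
            while len(owned) < n_cards:
                owned.update(random.sample(range(n_cards), pack_size))
                purchases += 1
            return purchases

        if __name__ == "__main__":
            estimate = mean(purchases_to_complete(50, 5) for _ in range(5000))
            print(f"Packs needed to complete 50 cards (5 per pack): about {estimate:.1f}")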

    Unravelling Ariadne’s Thread: Exploring the Threats of Decentralised DNS

    The current landscape of the core Internet technologies shows considerable centralisation, with the big tech companies controlling the vast majority of traffic and services. This situation has sparked a wide range of decentralisation initiatives, with blockchain technology being among the most prominent and successful innovations. At the same time, over the past years there have been considerable attempts to address the security and privacy issues affecting the Domain Name System (DNS). To this end, it is claimed that blockchain-based DNS may solve many of the limitations of traditional DNS. However, such an alternative comes with its own security concerns and issues, as the introduction and adoption of any new technology typically does, let alone a disruptive one. In this work we present the emerging threat landscape of blockchain-based DNS and empirically validate the threats with real-world data. Specifically, we explore a part of the blockchain DNS ecosystem in terms of the browser extensions using such technologies, the chains themselves (Namecoin and Emercoin), the domains, and the users registered on these platforms. Our findings reveal several potential domain extortion attempts and possible phishing schemes. Finally, we suggest countermeasures to address the identified threats and identify emerging research themes.

    Extended Coagulation Profiling in Isolated Traumatic Brain Injury: A CENTER-TBI Analysis

    Background: Trauma-induced coagulopathy in traumatic brain injury (TBI) remains associated with high rates of complications, unfavorable outcomes, and mortality. The underlying mechanisms are largely unknown. Embedded in the prospective multinational Collaborative European Neurotrauma Effectiveness Research in Traumatic Brain Injury (CENTER-TBI) study, coagulation profiles beyond standard conventional coagulation assays were assessed in patients with isolated TBI within the very early hours of injury. Methods: Results from blood samples (citrate/EDTA) obtained on hospital admission were matched with clinical and routine laboratory data of patients with TBI captured in the CENTER-TBI central database. To minimize confounding factors, patients with strictly isolated TBI (iTBI) (n = 88) were selected and stratified for coagulopathy by routine international normalized ratio (INR): (1) INR < 1.2 and (2) INR ≥ 1.2. An INR > 1.2 has been well adopted over time as a threshold to define trauma-related coagulopathy in general trauma populations. The following parameters were evaluated: Quick's value, activated partial thromboplastin time, fibrinogen, thrombin time, antithrombin, coagulation factor activity of factors V, VIII, IX, and XIII, protein C and S, plasminogen, D-dimer, fibrinolysis-regulating parameters (thrombin activatable fibrinolysis inhibitor, plasminogen activator inhibitor 1, antiplasmin), thrombin generation, and fibrin monomers. Results: Patients with iTBI with INR ≥ 1.2 (n = 16) had a high incidence of progressive intracranial hemorrhage associated with increased mortality and unfavorable outcome compared with patients with INR < 1.2 (n = 72). Activity of coagulation factors V, VIII, IX, and XIII dropped on average by 15–20% between the groups, whereas protein C and S levels dropped by 20%. With an elevated INR, thrombin generation decreased, as reflected by lower peak height and endogenous thrombin potential (ETP), whereas the amount of fibrin monomers increased. Plasminogen activity significantly decreased from 89% in patients with INR < 1.2 to 76% in patients with INR ≥ 1.2. Moreover, D-dimer levels significantly increased from a mean of 943 mg/L in patients with INR < 1.2 to 1,301 mg/L in patients with INR ≥ 1.2. Conclusions: This more in-depth analysis beyond routine conventional coagulation assays suggests a counterbalanced regulation of coagulation and fibrinolysis in patients with iTBI with hemostatic abnormalities. We observed distinct patterns involving key pathways of the highly complex and dynamic coagulation system that offer windows of opportunity for further research. Whether the changes observed on factor levels may be relevant and explain the worse outcome or the more severe brain injuries by themselves remains speculative.